Computational Capabilities of Recurrent NARX Neural Networks - IEEE Transactions on Systems, Man, and Cybernetics, Part B

Authors

  • Hava T. Siegelmann
  • Bill G. Horne
Abstract

Recently, fully connected recurrent neural networks have been proven to be computationally rich: at least as powerful as Turing machines. This work focuses on another network which is popular in control applications and has been found to be very effective at learning a variety of problems. These networks are based upon Nonlinear AutoRegressive models with eXogenous Inputs (NARX models), and are therefore called NARX networks. As opposed to other recurrent networks, NARX networks have a limited feedback which comes only from the output neuron rather than from hidden states. They are formalized by y(t) = Ψ(u(t − n_u), …, u(t − 1), u(t), y(t − n_y), …, y(t − 1)), where u(t) and y(t) represent the input and output of the network at time t, n_u and n_y are the input and output order, and the function Ψ is the mapping performed by a Multilayer Perceptron. We constructively prove that NARX networks with a finite number of parameters are computationally as strong as fully connected recurrent networks, and thus as Turing machines. We conclude that, in theory, one can use NARX models rather than conventional recurrent networks without any computational loss, even though their feedback is limited. Furthermore, these results raise the issue of how much feedback or recurrence is necessary for any network to be Turing equivalent, and what restrictions on feedback limit computational power.
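As an illustration of the NARX recurrence above, the following Python sketch simulates a scalar NARX network whose mapping Ψ is a one-hidden-layer MLP. It is a minimal sketch, not the construction from the paper: the function names (mlp, narx_simulate), the tanh hidden units, the zero padding of the delay lines, and the random weights are all assumptions made for the example.

import numpy as np

def mlp(x, W1, b1, W2, b2):
    """One-hidden-layer perceptron: tanh hidden units, linear output."""
    h = np.tanh(W1 @ x + b1)
    return (W2 @ h + b2).item()

def narx_simulate(u, n_u, n_y, W1, b1, W2, b2):
    """Run the NARX recurrence y(t) = Psi(u(t-n_u), ..., u(t), y(t-n_y), ..., y(t-1))."""
    y = np.zeros(len(u))
    for t in range(len(u)):
        # Tapped delay lines for input and output; times before t = 0 are zero-padded.
        u_taps = [u[t - k] if t - k >= 0 else 0.0 for k in range(n_u, -1, -1)]
        y_taps = [y[t - k] if t - k >= 0 else 0.0 for k in range(n_y, 0, -1)]
        y[t] = mlp(np.array(u_taps + y_taps), W1, b1, W2, b2)
    return y

# Example with random (untrained) weights: n_u = 2 input taps, n_y = 2 output taps.
rng = np.random.default_rng(0)
n_u, n_y, hidden = 2, 2, 5
d = (n_u + 1) + n_y                      # length of the tapped-delay vector fed to the MLP
W1, b1 = rng.normal(size=(hidden, d)), rng.normal(size=hidden)
W2, b2 = rng.normal(size=(1, hidden)), rng.normal(size=1)
print(narx_simulate(rng.normal(size=20), n_u, n_y, W1, b1, W2, b2))

In practice the weights would be learned; the point of the sketch is only the structure of the feedback, which comes exclusively from the past outputs y(t − 1), …, y(t − n_y) rather than from hidden states.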


Related articles

A dual neural network for kinematic control of redundant robot manipulators

The inverse kinematics problem in robotics can be formulated as a time-varying quadratic optimization problem. A new recurrent neural network, called the dual network, is presented in this paper. The proposed neural network is composed of a single layer of neurons, and the number of neurons is equal to the dimensionality of the workspace. The proposed dual network is proven to be globally expon...


Two recurrent neural networks for local joint torque optimization of kinematically redundant manipulators

This paper presents two neural network approaches to real-time joint torque optimization for kinematically redundant manipulators. Two recurrent neural networks are proposed for determining the minimum driving joint torques of redundant manipulators for the cases without and with taking the joint torque limits into consideration, respectively. The first neural network is called the Lagrangian n...


A reference model approach to stability analysis of neural networks

In this paper, a novel methodology called a reference model approach to stability analysis of neural networks is proposed. The core of the new approach is to study a neural network model with reference to other related models, so that different modeling approaches can be combinatively used and powerfully cross-fertilized. Focused on two representative neural network modeling approaches (the neu...


A recurrent neural network for minimum infinity-norm kinematic control of redundant manipulators with an improved problem formulation and reduced architecture complexity

This paper presents an improved neural computation scheme for kinematic control of redundant manipulators based on infinity-norm joint velocity minimization. Compared with a previous neural network approach to minimum infinity-norm kinematic control, the present approach is less complex in terms of cost of architecture. The recurrent neural network explicitly minimizes the maximum componen...


Constructing hysteretic memory in neural networks

Hysteresis is a unique type of dynamics with an important property: rate-independent memory. In addition to other memory-related studies such as time delay neural networks, recurrent networks, and reinforcement learning, rate-independent memory deserves further attention owing to its potential applications. In this paper, we attempt to define hysteretic memory (rate independent memory...



Journal: IEEE Transactions on Systems, Man, and Cybernetics, Part B

Publication year: 1997